A Performance Analysis of Delta and Huffman Compression Algorithms
Abstract
With recent trends in information and communication technology, the storage and transfer of data are two vital issues, with cost and speed implications respectively. Large volumes of data (text or image) are constantly being processed on the Internet or on personal computers, driving upgrades of current systems. Hence the need for compression, which reduces the required storage capacity and improves transfer speed. Data compression is the act of reducing the size of a file by minimizing redundant data; in a text file, redundant data can be frequently occurring characters or common vowels. This research presents a comparative performance analysis of the Huffman and Delta compression schemes. A compression program is used to convert data from an easy-to-use format (ASCII) to one optimized for compactness. The Huffman and Delta algorithms were implemented in C#, and their efficiency was evaluated on three parameters: the number of bits, the compression ratio, and the percentage of compression. It was found that the Huffman algorithm performs better for data compression, since it can store or transmit the fewest bits. The average compression percentage for the Huffman and Delta algorithms was found to be 39% and 45% respectively, which implies that for a large text file the Huffman algorithm will achieve a 39% reduction in file size and thereby increase the effective capacity of the storage medium.
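The two schemes compared in the abstract can be sketched briefly. The paper's implementation was in C#; the following Python sketch is illustrative only (the sample text, function names, and numeric sequence are assumptions, not taken from the paper). It builds a Huffman code table from character frequencies, delta-encodes a numeric sequence, and computes the percentage-of-compression metric against an 8-bit-per-character ASCII baseline:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code table (char -> bitstring) for the given text."""
    freq = Counter(text)
    # Heap entries are (frequency, tiebreaker, tree); the tiebreaker keeps
    # tuple comparison from ever reaching the (non-comparable) tree element.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, str):        # leaf: a character
            code[tree] = prefix
        else:                            # internal node: (left, right)
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
    walk(heap[0][2], "")
    return code

def delta_encode(values):
    """Store the first value, then only successive differences."""
    return values[:1] + [b - a for a, b in zip(values, values[1:])]

def compression_percentage(original_bits, compressed_bits):
    """Percentage reduction in size achieved by compression."""
    return 100.0 * (original_bits - compressed_bits) / original_bits

text = "this is an example of huffman coding"
code = huffman_code(text)
original_bits = 8 * len(text)  # ASCII baseline: 8 bits per character
compressed_bits = sum(len(code[ch]) for ch in text)
print(compression_percentage(original_bits, compressed_bits))

deltas = delta_encode([100, 101, 103, 102])  # small diffs need fewer bits
print(deltas)
```

Frequent characters receive short codewords, so the total bit count drops below the fixed 8-bit baseline; delta encoding instead exploits the fact that differences between neighbouring values tend to be small and thus representable in fewer bits.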
Similar Resources
Compression of Time-Varying Information Using Huffman Coding
Abstract: In this paper, we fit a function to the probability density curve representing an information stream using an artificial neural network. The result of this methodology is a specific function representing a memorizable probability density curve. We then use the resulting function for information compression with the Huffman algorithm. The difference between the proposed method and the general me...
Efficient Representation and Decoding of Static Huffman Code Tables in a Very Low Bit Rate Environment
The lossless entropy coding used in many image coding schemes is often overlooked, as most research centres on the lossy stages of image compression. This paper examines the relative merits of using static Huffman coding with a compact optimal table versus more sophisticated adaptive arithmetic methods. For very low bit rate image compression, the computationally simple Huffman method is sh...
Super-Scalar Database Compression between RAM and CPU Cache
Data-intensive query processing tasks like data mining, scientific data analysis, and decision support can leave a database system severely I/O-bound, even when common RAID configurations are used. Traditionally, this problem has been tackled by adding more and more disks, connected through expensive interconnect networks. This brute-force approach results in systems of which the price is domin...
Evaluation of Huffman and Arithmetic Algorithms for Multimedia Compression Standards
Compression is a technique to reduce the quantity of data without excessively reducing the quality of the multimedia data. The transmission and storage of compressed multimedia data is much faster and more efficient than that of the original uncompressed data. There are various techniques and standards for multimedia data compression, especially for image compression, such as the JPEG and JPEG2000 s...
Data Compression on the Sphere
Large data-sets defined on the sphere arise in many fields. In particular, recent and forthcoming observations of the anisotropies of the cosmic microwave background (CMB) made on the celestial sphere contain approximately three and fifty mega-pixels respectively. The compression of such data is therefore becoming increasingly important. We develop algorithms to compress data defined on the sph...
Publication date: 2009